Metric typographic units have been devised and proposed several times to overcome the various traditional point systems. After the French Revolution of 1789, one popular proponent of a switch to metric was Didot, who had standardised continental European typographic measurement a few decades earlier. The conversion did not happen, though. The ''Didot point'' was metrically redefined as 1⁄2660 m (about 0.376 mm) in 1879 by Berthold.

With the introduction of phototypesetting in the 1970s, metric units were increasingly used in typography. The ''Didot point'' was redefined once again to exactly 375 μm (3⁄8 mm). Furthermore, a new unit, the quart (''quarter millimetre'', abbreviated ‘q’) of 250 μm (1⁄4 mm), was devised. Otl Aicher strongly advocated the use of the quart and provided a suggested list of common sizes. Note that Aicher’s font sizes are based on the DIN standard then in development, which uses the H-height, whereas in lead typesetting the larger cap height was used. Some typographers have proposed using the x-height instead, because the perceived size of type depends more on the size of the lowercase letters.

The advent and success of desktop publishing (DTP) software and word processors for office use, coming mostly from the non-metric United States, effectively reversed this metrication process in typography. DTP commonly uses the PostScript point, which is defined as 1⁄72 of an inch (approximately 352.8 μm).

==Device resolutions==
The resolution of computer screens is often specified as a pitch in millimetres, whereas office printers are usually specified reciprocally in dots per inch (‘dpi’, ‘d/in’). Phototypesetters have long used micrometres. To convert a dpi resolution to a μm pitch, the formula is 25400 ⁄ ''R'', where ''R'' is the resolution in dpi and 25 400 is the number of micrometres in an inch. For example, 96 dpi translates to a pitch of about 265 μm. The CSS3 media queries draft introduces a unit dpcm (dots per centimetre) for resolution.
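Because an inch is exactly 25.4 mm, these conversions are simple ratios. A minimal Python sketch of the arithmetic described above (the function names are illustrative, not taken from any standard library):

<syntaxhighlight lang="python">
# Conversions between dpi resolution, micrometre dot pitch, and the CSS3
# dpcm unit. Constant and function names are illustrative only.

MICROMETRES_PER_INCH = 25_400  # 1 inch = 25.4 mm = 25 400 um

def dpi_to_um(dpi: float) -> float:
    """Dot pitch in micrometres for a resolution given in dots per inch."""
    return MICROMETRES_PER_INCH / dpi

def um_to_dpi(pitch_um: float) -> float:
    """Resolution in dots per inch for a dot pitch given in micrometres."""
    return MICROMETRES_PER_INCH / pitch_um

def dpcm_to_dpi(dpcm: float) -> float:
    """Convert dots per centimetre (CSS3 dpcm) to dots per inch."""
    return dpcm * 2.54

if __name__ == "__main__":
    print(round(dpi_to_um(96)))      # 265   -> 96 dpi gives a ~265 um pitch
    print(round(um_to_dpi(250)))     # 102   -> one dot per quart is ~102 dpi
    print(round(dpi_to_um(72), 1))   # 352.8 -> one PostScript point in um
</syntaxhighlight>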